Data Dropout in Arbitrary Basis for Deep Network Regularization

Authors

  • Mostafa Rahmani
  • George Atia
Abstract

An important problem in training deep networks with high capacity is ensuring that the trained network generalizes well to new inputs. Dropout is an effective regularization technique for boosting a network's generalization, in which a random subset of the elements of the given data and the extracted features is set to zero during training. In this paper, a new randomized regularization technique is proposed in which we withhold a random part of the data without necessarily turning off individual neurons or data elements. In the proposed method, of which conventional dropout is shown to be a special case, random data dropout is performed in an arbitrary basis. In addition, we present a framework to efficiently apply the proposed technique to convolutional neural networks. The presented numerical experiments show that the proposed technique yields notable performance gains. The proposed approach, dubbed Generalized Dropout, provides deeper insight into the idea of dropout, shows that different basis matrices can yield different performance gains, and opens up a new research question of how to choose optimal basis matrices that achieve maximal performance gain.
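To make the mechanism concrete, below is a minimal NumPy sketch of dropout in an arbitrary orthonormal basis; the function name `generalized_dropout`, the inverted-dropout rescaling by 1/(1 - p), and the QR-based choice of basis are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def generalized_dropout(x, Q, drop_prob=0.5, rng=None):
    """Drop random coefficients of x expressed in the basis Q.

    x : (n,) data vector.
    Q : (n, n) orthonormal basis (columns are basis vectors).
    With Q = np.eye(n), this reduces to conventional dropout
    applied directly to the data elements.
    """
    rng = np.random.default_rng() if rng is None else rng
    coeffs = Q.T @ x                               # represent x in the basis Q
    mask = rng.random(coeffs.shape) >= drop_prob   # keep each coefficient w.p. 1 - p
    coeffs = coeffs * mask / (1.0 - drop_prob)     # inverted-dropout rescaling (assumed)
    return Q @ coeffs                              # map back to the original domain

# Example with a random orthonormal basis obtained via QR decomposition.
rng = np.random.default_rng(0)
n = 8
Q, _ = np.linalg.qr(rng.normal(size=(n, n)))
x = np.arange(n, dtype=float)
x_dropped = generalized_dropout(x, Q, drop_prob=0.3, rng=rng)
```

Choosing Q as the identity recovers standard element-wise dropout, while other orthonormal matrices (e.g., a DCT or wavelet basis) drop random directions in the transformed domain rather than individual data elements.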


Related Works

Regularization of Deep Neural Networks with Spectral Dropout

The big breakthrough on the ImageNet challenge in 2012 was partially due to the ‘dropout’ technique used to avoid overfitting. Here, we introduce a new approach called ‘Spectral Dropout’ to improve the generalization ability of deep neural networks. We cast the proposed approach in the form of regular Convolutional Neural Network (CNN) weight layers using a decorrelation transform with fixed ba...
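The snippet above is truncated, but the core operation it describes (randomly zeroing the coefficients of a fixed decorrelating transform) can be sketched as follows. The orthonormal DCT used here is purely an illustrative choice of fixed basis, not necessarily the exact transform from the cited paper.

```python
import numpy as np
from scipy.fft import dct, idct

def spectral_dropout(x, drop_prob=0.2, rng=None):
    """Illustrative frequency-domain dropout on a 1-D signal x:
    zero a random subset of DCT coefficients, then transform back."""
    rng = np.random.default_rng() if rng is None else rng
    c = dct(x, norm='ortho')                 # fixed decorrelating transform
    mask = rng.random(c.shape) >= drop_prob  # drop each frequency w.p. drop_prob
    return idct(c * mask, norm='ortho')      # return to the signal domain
```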

Regularization for Unsupervised Deep Neural Nets

Unsupervised neural networks, such as restricted Boltzmann machines (RBMs) and deep belief networks (DBNs), are powerful tools for feature selection and pattern recognition tasks. We demonstrate that overfitting occurs in such models just as in deep feedforward neural networks, and discuss possible regularization methods to reduce overfitting. We also propose a “partial” approach to improve the...

To Drop or Not to Drop: Robustness, Consistency and Differential Privacy Properties of Dropout

Training deep belief networks (DBNs) requires optimizing a non-convex function with an extremely large number of parameters. Naturally, existing gradient descent (GD) based methods are prone to arbitrarily poor local minima. In this paper, we rigorously show that such local minima can be avoided (upto an approximation error) by using the dropout technique, a widely used heuristic in this domain...

Conditional Generative Adversarial Nets Classifier for Spoken Language Identification

The i-vector technique using deep neural network has been successfully applied in spoken language identification systems. Neural network modeling showed its effectiveness as both discriminant feature transformation and classification in many tasks, in particular with a large training data set. However, on a small data set, neural networks suffer from the overfitting problem which degrades the p...

Pseudo-Label : The Simple and Efficient Semi-Supervised Learning Method for Deep Neural Networks

We propose a simple and efficient method of semi-supervised learning for deep neural networks. Basically, the proposed network is trained in a supervised fashion with labeled and unlabeled data simultaneously. For unlabeled data, Pseudo-Labels, obtained by simply picking the class with the maximum predicted probability, are used as if they were true labels. This is in effect equivalent to Entropy R...
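As a minimal sketch of the labeling rule this snippet describes, the following NumPy function converts predicted class probabilities for unlabeled examples into hard pseudo-labels via the argmax; the optional confidence threshold is an illustrative extension, not part of the original method.

```python
import numpy as np

def make_pseudo_labels(probs, threshold=0.0):
    """probs : (batch, num_classes) predicted class probabilities
    for unlabeled data. Returns argmax pseudo-labels and a mask of
    examples whose top probability clears the (optional) threshold."""
    labels = probs.argmax(axis=1)          # most probable class as the label
    keep = probs.max(axis=1) >= threshold  # optional confidence filter (assumed)
    return labels, keep
```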


Journal:
  • CoRR

Volume: abs/1712.00891

Publication year: 2017